Backward-Forward Least Angle Shrinkage for Sparse Quadratic Optimization

Authors

  • Tianyi Zhou
  • Dacheng Tao
Abstract

In compressed sensing and the statistics community, dozens of algorithms have been developed to solve ℓ1-penalized least squares regression, but constrained sparse quadratic optimization (SQO) is still an open problem. In this paper, we propose backward-forward least angle shrinkage (BF-LAS), which provides a scheme to solve general SQO, including sparse eigenvalue minimization. BF-LAS starts from the dense solution, iteratively shrinks unimportant variables' magnitudes to zero in the backward step for minimizing the ℓ1 norm, decreases important variables' gradients in the forward step for optimizing the objective, and projects the solution onto the feasible set defined by the constraints. The importance of a variable is measured by its correlation w.r.t. the objective and is updated via least angle shrinkage (LAS). We show promising performance of BF-LAS on sparse dimension reduction.
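The backward/forward/projection cycle described in the abstract can be sketched roughly as follows. This is an illustrative interpretation only, not the authors' exact BF-LAS update: the importance measure (absolute gradient vs. median), the step sizes, and the unit-sphere feasible set are all assumptions made for the sketch.

```python
import numpy as np

def bf_las_sketch(Q, lam=0.1, step=0.05, iters=200):
    """Illustrative backward-forward iteration for minimizing
    0.5 * x^T Q x + lam * ||x||_1 over the unit sphere.
    A schematic sketch only -- not the paper's exact BF-LAS update."""
    n = Q.shape[0]
    x = np.ones(n) / np.sqrt(n)               # start from a dense solution
    for _ in range(iters):
        g = Q @ x                             # gradient of the quadratic part
        importance = np.abs(g)                # assumed proxy for "importance"
        weak = importance < np.median(importance)
        # backward step: soft-threshold unimportant variables toward zero
        x[weak] = np.sign(x[weak]) * np.maximum(np.abs(x[weak]) - step * lam, 0.0)
        # forward step: gradient descent on the important variables
        x[~weak] -= step * g[~weak]
        # projection onto the feasible set (here assumed: the unit sphere)
        norm = np.linalg.norm(x)
        if norm > 0:
            x /= norm
    return x
```

On a toy diagonal quadratic, the iteration drives the solution toward a sparse unit vector supported on the smallest-eigenvalue coordinate, mirroring the sparse eigenvalue minimization use case mentioned above.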


Similar articles

A Field Guide to Forward-Backward Splitting with a FASTA Implementation

Non-differentiable and constrained optimization play a key role in machine learning, signal and image processing, communications, and beyond. For high-dimensional minimization problems involving large datasets or many unknowns, the forward-backward splitting method (also known as the proximal gradient method) provides a simple, yet practical solver. Despite its apparent simplicity, the performan...
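Forward-backward splitting alternates a gradient (forward) step on the smooth term with a proximal (backward) step on the nonsmooth term. A minimal sketch for ℓ1-penalized least squares (plain ISTA with a fixed step; the actual FASTA solver layers adaptive stepsizes and other refinements on top of this basic scheme):

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def ista(A, b, lam=0.1, iters=500):
    """Forward-backward splitting (proximal gradient / ISTA) for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)              # forward (gradient) step
        x = soft_threshold(x - step * grad, step * lam)  # backward (prox) step
    return x
```

With A equal to the identity the fixed point is simply the soft-thresholded measurement vector, which makes the scheme easy to sanity-check.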


Linear-Quadratic Control of Backward Stochastic Differential Equations

This paper is concerned with optimal control of linear backward stochastic differential equations (BSDEs) with a quadratic cost criterion, or backward linear-quadratic (BLQ) control. The solution of this problem is obtained completely and explicitly by using an approach which is based primarily on the completion-of-squares technique. Two alternative, though equivalent, expressions for the optima...


Compressed sensing signal recovery via forward-backward pursuit

Recovery of sparse signals from compressed measurements constitutes an ℓ0-norm minimization problem, which is impractical to solve. A number of sparse recovery approaches have appeared in the literature, including ℓ1 minimization techniques, greedy pursuit algorithms, Bayesian methods and nonconvex optimization techniques, among others. This manuscript introduces a novel two-stage greedy approac...
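The truncated abstract describes a two-stage greedy scheme. The sketch below follows the general forward-backward pursuit pattern (grow the support by f atoms with the largest residual correlation, solve least squares, then prune the b weakest coefficients); the values of f and b and the stopping rule are illustrative assumptions, not the paper's exact parameters.

```python
import numpy as np

def forward_backward_pursuit(A, y, k, f=2, b=1, iters=50):
    """Sketch of a two-stage greedy pursuit: expand the support forward,
    then prune it backward, until k atoms are kept or the residual vanishes."""
    n = A.shape[1]
    support = np.array([], dtype=int)
    x = np.zeros(n)
    residual = y.copy()
    for _ in range(iters):
        corr = np.abs(A.T @ residual)
        corr[support] = -np.inf                   # only consider new atoms
        new = np.argsort(corr)[-f:]               # forward: add f atoms
        support = np.union1d(support, new)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        prune = np.argsort(np.abs(coef))[:b]      # backward: drop b weakest
        support = np.delete(support, prune)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - A @ x
        if len(support) >= k or np.linalg.norm(residual) < 1e-10:
            break
    return x
```

Because each iteration adds f atoms but removes only b, the support grows by f - b per step, which is what lets the backward stage correct early wrong choices without stalling.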


FAST SPLITTING ALGORITHMS FOR CONVEX OPTIMIZATION. BEYOND NESTEROV COMPLEXITY BOUND O(1/k)

Large scale optimization problems naturally appear in the modeling of many scientific and engineering situations. To meet the challenges posed by these issues, considerable efforts have been devoted in recent years to the study of first-order splitting algorithms. The forward-backward algorithm (also called the proximal-gradient algorithm), which is one of the most important, is a powerful too...
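The basic forward-backward step can be accelerated with a Nesterov-style inertial term. Below is a standard FISTA-type sketch of that acceleration for the ℓ1-penalized least squares objective; it illustrates the generic inertial scheme, not the specific algorithm studied in the paper.

```python
import numpy as np

def fista(A, b, lam=0.1, iters=300):
    """Inertial forward-backward (FISTA-style) iteration for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    y = x.copy()                              # extrapolated point
    t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ y - b)              # forward (gradient) step at y
        v = y - step * grad
        x_new = np.sign(v) * np.maximum(np.abs(v) - step * lam, 0.0)  # prox step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # Nesterov momentum
        x, t = x_new, t_new
    return x
```

The only change relative to the plain forward-backward iteration is that the gradient is evaluated at the extrapolated point y rather than at x, which improves the worst-case objective decay from O(1/k) to O(1/k²).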


Forward and Backward Uncertainty Quantification in Optimization

This contribution gathers some of the ingredients presented during the Iranian Operational Research community gathering in Babolsar in 2019. It is a collection of several previous publications on how to set up an uncertainty quantification (UQ) cascade with ingredients of growing computational complexity for both forward and reverse uncertainty propagation.



Journal title:

Volume   Issue

Pages  -

Publication date: 2010